Probability Distributions and Maximum Entropy
Author
Abstract
If we want to assign probabilities to an event, and see no reason for one outcome to occur more often than any other, then the events are assigned equal probabilities. This is called the principle of insufficient reason, or principle of indifference, and goes back to Laplace. If we happen to know (or learn) something about the non-uniformity of the outcomes, how should the assignment of probabilities be changed? There is an extension of the principle of insufficient reason that suggests what to do. It is called the principle of maximum entropy. After defining entropy and computing it in some examples, we will describe this principle and see how it provides a natural conceptual role for many standard probability distributions (normal, exponential, Laplace, Bernoulli). In particular, the normal distribution will be seen to have a distinguishing property among all continuous probability distributions on R that may be simpler for students in an undergraduate probability course to appreciate than the special role of the normal distribution in the central limit theorem. This paper is organized as follows. In Sections 2 and 3, we describe the principle of maximum entropy in three basic examples. The explanation of these examples is given in Section 4 as a consequence of a general result (Theorem 4.3). Section 5 provides further illustrations of the maximum entropy principle, with details largely left to the reader as practice. In Section 6 we state Shannon’s theorem, which characterizes the entropy function on finite sample spaces. Finally, in Section 7, we prove a theorem about positivity of maximum entropy distributions in the presence of suitable constraints, and derive a uniqueness theorem. In that final section entropy is considered on abstract measure spaces, while the earlier part is written in the language of discrete and continuous probability distributions in order to be more accessible to a motivated undergraduate studying probability.
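The abstract's central claim is concrete enough to check numerically: on a finite sample space, Shannon entropy is maximized by the uniform distribution, in line with the principle of insufficient reason. A minimal sketch (the function name shannon_entropy is ours, not from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i * log(p_i), in nats.
    Terms with p_i = 0 contribute 0 by the usual convention."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# On a sample space with n outcomes, the uniform distribution
# maximizes entropy; the maximum value is log(n).
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
point = [1.0, 0.0, 0.0, 0.0]

print(shannon_entropy(uniform))  # log 4, roughly 1.386: the maximum
print(shannon_entropy(skewed))   # strictly smaller
print(shannon_entropy(point))    # 0: no uncertainty at all
```

The same maximization, run subject to moment constraints rather than over all distributions, is what singles out the normal, exponential, and other families discussed in the paper.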
Similar Articles
Determination of Maximum Bayesian Entropy Probability Distribution
In this paper, we consider methods for determining maximum entropy multivariate distributions with a given prior, under the constraints that the marginal distributions, or the marginals and covariance matrix, are prescribed. Next, some numerical solutions are considered for the cases where no closed form of the solution is available. Finally, these methods are illustrated via some numerical examples.
A Note on the Bivariate Maximum Entropy Modeling
Let X = (X1, X2) be a continuous random vector. Under the assumption that the marginal distributions of X1 and X2 are given, we develop models for the vector X when there is partial information about the dependence structure between X1 and X2. The models, which are obtained based on the well-known Principle of Maximum Entropy, are called the maximum entropy (ME) mo...
Determination of Maximum Entropy Probability Distribution via Burg’s Measure of Entropy
In this paper, we consider methods of obtaining maximum entropy multivariate distributions via Burg’s measure of entropy under the constraints that the marginal distributions or the marginals and covariance matrix are prescribed. Next, a numerical method is considered for the cases of unavailable closed form of solutions. Finally, the method is illustrated via a numerical example.
Maximum Entropy Beyond Selecting Probability Distributions
Traditionally, the Maximum Entropy technique is used to select a probability distribution in situations when several different probability distributions are consistent with our knowledge. In this paper, we show that this technique can be extended beyond selecting probability distributions, to explain facts, numerical values, and even types of functional dependence.
Introduction to Continuous Entropy
Classically, Shannon entropy was formalized over discrete probability distributions. However, the concept of entropy can be extended to continuous distributions through a quantity known as continuous (or differential) entropy. The most common definition for continuous entropy is seemingly straightforward; however, further analysis reveals a number of shortcomings that render it far less useful ...
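One of the shortcomings alluded to above can be made concrete: unlike discrete Shannon entropy, differential entropy can be negative. For the uniform distribution on (0, a) the density is 1/a, so h = −∫ (1/a) log(1/a) dx = log a, which is negative whenever a < 1. A small illustrative sketch (the helper name is ours):

```python
import math

def differential_entropy_uniform(a):
    """Differential entropy of Uniform(0, a) in nats: h = log(a)."""
    return math.log(a)

print(differential_entropy_uniform(math.e))  # about 1.0
print(differential_entropy_uniform(1.0))     # 0.0
print(differential_entropy_uniform(0.5))     # negative, unlike any discrete entropy
```

Differential entropy is also not invariant under rescaling the variable, which is another reason it is a less direct analogue of the discrete notion than the definition suggests.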
Tsallis Maximum Entropy Lorenz Curves
In this paper, at first we derive a family of maximum Tsallis entropy distributions under optional side conditions on the mean income and the Gini index. Furthermore, corresponding with these distributions a family of Lorenz curves compatible with the optional side conditions is generated. Meanwhile, we show that our results reduce to Shannon entropy as β tends to one. Finally, by using ac...
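The claimed reduction to Shannon entropy as β tends to one is easy to check numerically, using the standard Tsallis form S_β(p) = (1 − Σ p_i^β)/(β − 1) (an assumption on our part; the paper's exact normalization may differ):

```python
import math

def tsallis_entropy(p, beta):
    """Tsallis entropy S_beta(p) = (1 - sum p_i^beta) / (beta - 1), beta != 1."""
    return (1.0 - sum(pi ** beta for pi in p)) / (beta - 1.0)

def shannon_entropy(p):
    """Shannon entropy in nats, the beta -> 1 limit of Tsallis entropy."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]
for beta in (2.0, 1.1, 1.01, 1.001):
    print(beta, tsallis_entropy(p, beta))  # approaches the Shannon value
print("Shannon:", shannon_entropy(p))
```

The convergence follows from expanding p_i^β = p_i e^{(β−1) log p_i} to first order in β − 1.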